Decomposing Overcomplete 3rd Order Tensors using Sum-of-Squares Algorithms
Authors
Abstract
Tensor rank and low-rank tensor decompositions have many applications in learning and complexity theory. Most known algorithms use unfoldings of tensors and can only handle rank up to n^⌊p/2⌋ for a p-th order tensor in ℝ^{n^p}. Previously, no efficient algorithm could decompose 3rd order tensors when the rank is super-linear in the dimension. Using ideas from the sum-of-squares hierarchy, we give the first quasi-polynomial time algorithm that can decompose a random 3rd order tensor when the rank is as large as n^{3/2}/polylog(n). We also give a polynomial time algorithm for certifying the injective norm of random low-rank tensors. Our tensor decomposition algorithm exploits the relationship between the injective norm and the tensor components. The proof relies on interesting tools for decoupling random variables to prove better matrix concentration bounds.
1998 ACM Subject Classification: I.2.6 Learning
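To illustrate the rank barrier the abstract refers to, here is a minimal numpy sketch (not from the paper): the mode-1 unfolding of an n×n×n tensor is an n×n² matrix, so its matrix rank is capped at n, and any unfolding-based method loses information once the tensor rank exceeds roughly n^⌊p/2⌋. All variable names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 8, 5  # dimension and target rank (r <= n here)

# Build T = sum_i a_i ⊗ b_i ⊗ c_i, a random rank-r 3rd order tensor.
A, B, C = (rng.standard_normal((n, r)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Mode-1 unfolding: reshape T into an n x n^2 matrix.
T1 = T.reshape(n, n * n)

# For generic components its matrix rank is min(r, n), so an
# unfolding cannot certify ranks beyond n for a 3rd order tensor.
print(np.linalg.matrix_rank(T1))  # -> 5
```

With r > n the unfolding's rank would saturate at n, which is why going beyond linear rank requires tools such as the sum-of-squares hierarchy.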
Similar resources
Spectral algorithms for tensor completion
Abstract. In the tensor completion problem, one seeks to estimate a low-rank tensor based on a random sample of revealed entries. In terms of the required sample size, earlier work revealed a large gap between estimation with unbounded computational resources (using, for instance, tensor nuclear norm minimization) and polynomial-time algorithms. Among the latter, the best statistical guarantees...
Iterative Methods for Symmetric Outer Product Tensor Decomposition
We study the symmetric outer product for tensors. Specifically, we look at decomposition of fully (partially) symmetric tensor into a sum of rank-one fully (partially) symmetric tensors. We present an iterative technique for the third-order partially symmetric tensor and fourthorder fully and partially symmetric tensor. We included several numerical examples which indicate a faster convergence ...
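The object this abstract studies can be sketched in a few lines of numpy (an illustrative construction, not the paper's iterative method): a fully symmetric rank-r tensor is a sum of symmetric rank-one terms v_i ⊗ v_i ⊗ v_i, and is invariant under every permutation of its indices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 6, 3

# Fully symmetric rank-r tensor: T = sum_i v_i ⊗ v_i ⊗ v_i.
V = rng.standard_normal((n, r))
T = np.einsum('ir,jr,kr->ijk', V, V, V)

# Symmetry check: swapping any pair of modes leaves T unchanged.
assert np.allclose(T, T.transpose(1, 0, 2))
assert np.allclose(T, T.transpose(2, 1, 0))
```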
Learning with Tensor Representation
Most of the existing learning algorithms take vectors as their input data. A function is then learned in such a vector space for classification, clustering, or dimensionality reduction. However, in some situations, there is reason to consider data as tensors. For example, an image can be considered as a second order tensor and a video can be considered as a third order tensor. In this paper, we...
An Incremental DC Algorithm for the Minimum Sum-of-Squares Clustering
Here, an algorithm is presented for solving the minimum sum-of-squares clustering problems using their difference of convex representations. The proposed algorithm is based on an incremental approach and applies the well known DC algorithm at each iteration. The proposed algorithm is tested and compared with other clustering algorithms using large real world data sets.
Developing Tensor Operations with an Underlying Group Structure
Tensor computations frequently involve factoring or decomposing a tensor into a sum of rank-1 tensors (CANDECOMP-PARAFAC, HOSVD, etc.). These decompositions are often considered as different higher-order extensions of the matrix SVD. The HOSVD can be described using the n-mode product, which describes multiplication between a higher-order tensor and a matrix. Generalizing this multiplication le...
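The n-mode product mentioned above has a compact expression via `einsum`; a minimal sketch (assumed shapes are illustrative): multiplying a tensor by a matrix along mode n contracts that mode with the matrix's columns and replaces its dimension with the matrix's row count.

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((3, 4, 5))  # a 3rd order tensor
M = rng.standard_normal((7, 4))    # acts on mode 2 (size 4)

# 2-mode product T x_2 M: contract mode 2 of T against the columns of M,
# turning the size-4 mode into a size-7 mode.
Y = np.einsum('ijk,pj->ipk', T, M)
print(Y.shape)  # -> (3, 7, 5)
```

Applying such products with the factor matrices of an SVD along every mode is exactly how the HOSVD core tensor is computed.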
Journal title:
Volume/Issue:
Pages: -
Publication date: 2015